πŸ€ The Hardware Lottery 🎰
by Sara Hooker, Google Brain [ACM]

- The very first computer hardware was highly specialized, built to solve one particular problem, such as numerical differentiation or evaluating polynomial models. In the 1960s IBM introduced the concept of an instruction set architecture (the System/360 family), which made it easier for software developers to migrate between machines. From then until the 2010s we lived in a world of general-purpose hardware: the CPU.

- Ideas in computer science win or lose not because one is inherently superior to another, but because some lack suitable hardware on which to be implemented. Backpropagation, the key algorithm that made the deep learning revolution possible, was invented independently in 1963, 1976, and 1988, and was finally applied to CNNs in 1989. Yet it took roughly three more decades for deep neural networks to be widely accepted as a promising research direction, and the breakthrough results came on GPUs, which can run massively parallel computations.

- Today the hardware pendulum is swinging back toward domain-specific hardware, much as things stood before the invention of the general-purpose CPU.

- Hardware should not remain a limiting factor for breakthrough ideas in AI research. Hardware and software should be co-designed for state-of-the-art algorithms, and algorithm developers need a deeper understanding of the compute platforms they run on.
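
The GPU point above can be made concrete with a minimal sketch (illustrative, not from the paper; all names and hyperparameters are made up): both the forward and backward passes of backpropagation reduce to dense matrix multiplications, which is exactly the workload GPUs parallelize well.

```python
# A minimal sketch of backpropagation for a tiny two-layer network.
# Note that the forward pass and every gradient below are dense matrix
# multiplications, the operation GPUs run in massively parallel fashion.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, 1 target (all synthetic).
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Two-layer network with a tanh hidden layer.
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

def mse(W1, W2):
    return float(np.mean((np.tanh(X @ W1) @ W2 - y) ** 2))

loss_before = mse(W1, W2)

lr = 0.05
for _ in range(500):
    # Forward pass: two matrix multiplies.
    h = np.tanh(X @ W1)
    err = h @ W2 - y                     # gradient of MSE w.r.t. prediction (up to 2/N)

    # Backward pass: the chain rule again yields matrix multiplies.
    dW2 = h.T @ err
    dW1 = X.T @ ((err @ W2.T) * (1.0 - h ** 2))

    W2 -= lr * dW2
    W1 -= lr * dW1

loss_after = mse(W1, W2)
```

On a GPU framework the same `@` operations would dispatch to parallel kernels; the algorithm is unchanged, only the hardware it lands on is.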

read also here



tg-me.com/pdp11ml/352

πŸ€ The Hardware Lottery 🎰
by Sarah Hooker, Google Brain [ACM]

- The very first computer hardware was extremely focused on solving one particular problem - numerical differentiation or polynomial models. In the 1960s IBM invented the concept of Instruction Set and made migration between hardware easier for software developers. Till the 2010s we have been living in the world of general-purpose hardware - CPUs.

- Computer Science Ideas win or lose not because one superior one to another, but because some of them did not have the suitable hardware to be implemented in. Back Propagation Algorithm, the key algorithm that made the deep learning revolution possible, was invented independently in 1963, 1976, 1988 and finally applied to CNN in 1989. However, it was only three decades later that deep neural networks were widely accepted as a promising research direction and the significant result was achieved with GPUs, that could run massive parallel computations.

- Today hardware pendulum is swinging back to domain-specific hardware like it was the CPU invention

- Hardware should not remain a limiting factor for the breakthrough ideas in AI research. Hardware and Software should be codesigned for the SOTA algorithms. Algorithm developers need a deeper understanding of the computer platforms.

read also here

BY PDP-11πŸš€

